Symmetrized Bregman Divergences and Metrics
Abstract
While Bregman divergences [3] have been used for several machine learning problems in recent years, the facts that they are asymmetric and do not satisfy the triangle inequality have been a major limitation. In this paper, we investigate the relationship between two families of symmetrized Bregman divergences and metrics, which satisfy the triangle inequality. Further, we investigate k-means-type clustering problems using both families of symmetrized divergences, and give efficient algorithms for them. The first family, called Generalized Symmetrized Bregman (GSB) divergences, can be derived from any well-behaved convex function. In particular, if φ is a convex function of Legendre type [5], the GSB divergence can be defined as:
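The abstract is cut off before the formula, so only a sketch of the standard ingredients is given here; the paper's exact GSB expression, which draws on the convex conjugate φ* available for Legendre-type φ, is not reproduced. The Bregman divergence generated by φ and its classical Jeffreys-type symmetrization are:

\[
  d_\varphi(x, y) = \varphi(x) - \varphi(y) - \langle x - y,\, \nabla\varphi(y) \rangle,
\]
\[
  S_\varphi(x, y) = d_\varphi(x, y) + d_\varphi(y, x) = \langle x - y,\, \nabla\varphi(x) - \nabla\varphi(y) \rangle,
\]

and for φ of Legendre type the duality \( d_\varphi(x, y) = d_{\varphi^*}(\nabla\varphi(y), \nabla\varphi(x)) \) lets both φ and φ* enter such symmetrized constructions.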
Similar papers
Bregman Divergences and Triangle Inequality
While Bregman divergences have been used for clustering and embedding problems in recent years, the facts that they are asymmetric and do not satisfy the triangle inequality have been a major concern. In this paper, we investigate the relationship between two families of symmetrized Bregman divergences and metrics, which satisfy the triangle inequality. The first family can be derived from any well...
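As a small illustration of the concern (an illustrative probe, not code from these papers; the names kl and jeffreys are chosen here), the Jeffreys-type symmetrization of the Kullback-Leibler divergence is symmetric yet still fails the triangle inequality on sampled triples:

import numpy as np

rng = np.random.default_rng(0)

def kl(p, q):
    # Kullback-Leibler divergence: the Bregman divergence of negative entropy.
    return float(np.sum(p * np.log(p / q)))

def jeffreys(p, q):
    # Jeffreys-type symmetrization: symmetric, but not a metric.
    return kl(p, q) + kl(q, p)

violations = 0
for _ in range(10_000):
    p, q, r = rng.dirichlet(np.ones(5), size=3)
    if jeffreys(p, r) > jeffreys(p, q) + jeffreys(q, r) + 1e-12:
        violations += 1
print("triangle-inequality violations:", violations)  # typically nonzero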
On the Centroids of Symmetrized Bregman Divergences
In this paper, we generalize the notions of centroids and barycenters to the broad class of information-theoretic distortion measures called Bregman divergences. Bregman divergences are versatile, and unify quadratic geometric distances with various statistical entropic measures. Because Bregman divergences are typically asymmetric, we consider both the left-sided and right-sided centroids and ...
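The closed forms for the two sided centroids are classical: the right-sided centroid is the arithmetic mean for every Bregman divergence (Banerjee et al.), while the left-sided centroid is the mean taken in gradient space. A minimal sketch, assuming the convention that d_φ(x, c) places the centroid in the second argument, and using φ(x) = Σ(x log x − x), whose gradient is log with inverse exp:

import numpy as np

def right_centroid(X):
    # argmin_c sum_i d_phi(x_i, c) is the arithmetic mean,
    # for every Bregman divergence (Banerjee et al., 2005).
    return X.mean(axis=0)

def left_centroid(X, grad=np.log, grad_inv=np.exp):
    # argmin_c sum_i d_phi(c, x_i) is the mean taken in gradient space:
    # grad^{-1} of the average of grad(x_i).
    return grad_inv(grad(X).mean(axis=0))

X = np.array([[0.2, 0.5, 0.3],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
print(right_centroid(X))  # arithmetic mean
print(left_centroid(X))   # coordinate-wise geometric mean for this phi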
The Sided and Symmetrized Bregman Centroids
We generalize the notions of centroids (and barycenters) to the broad class of information-theoretic distortion measures called Bregman divergences. Bregman divergences form a rich and versatile family of distances that unifies quadratic Euclidean distances with various well-known statistical entropic measures. Since, besides the squared Euclidean distance, Bregman divergences are asymmetric, we...
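One known route to the symmetrized centroid is that it lies on the curve linking the two sided centroids in gradient coordinates, so a one-dimensional search suffices. A hedged sketch of that idea for the generalized KL divergence; the interpolation curve and the use of ternary search are assumptions standing in for the paper's exact dichotomic procedure:

import numpy as np

def gen_kl(p, q):
    # Bregman divergence of phi(x) = sum(x log x - x): generalized KL.
    return float(np.sum(p * np.log(p / q) - p + q))

def symmetrized_centroid(X, steps=60):
    c_r = X.mean(axis=0)                    # right-sided centroid (arithmetic mean)
    c_l = np.exp(np.log(X).mean(axis=0))    # left-sided centroid (geometric mean)
    def on_curve(lam):
        # interpolate between the sided centroids in gradient (log) coordinates
        return np.exp((1 - lam) * np.log(c_r) + lam * np.log(c_l))
    def loss(lam):
        c = on_curve(lam)
        return sum(gen_kl(x, c) + gen_kl(c, x) for x in X)
    lo, hi = 0.0, 1.0
    for _ in range(steps):                  # ternary search, assuming a unimodal loss
        m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if loss(m1) < loss(m2):
            hi = m2
        else:
            lo = m1
    return on_curve(0.5 * (lo + hi))

X = np.array([[0.2, 0.5, 0.3], [0.1, 0.6, 0.3], [0.4, 0.4, 0.2]])
print(symmetrized_centroid(X))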
Metrics Defined by Bregman Divergences
Bregman divergences are generalizations of the well-known Kullback-Leibler divergence. They are based on convex functions and have recently received great attention. We present a class of “square-root metrics” based on Bregman divergences. They can be regarded as a natural generalization of Euclidean distance. We provide necessary and sufficient conditions for a convex function so that the squar...
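A familiar instance of the square-root phenomenon, given here purely as an illustration rather than as the paper's construction: the square root of the Jensen-Shannon divergence (a Jensen-type rather than a plain Bregman divergence) is known to be a metric. A quick numerical probe of the triangle inequality:

import numpy as np

rng = np.random.default_rng(1)

def kl(p, q):
    return float(np.sum(p * np.log(p / q)))

def js(p, q):
    # Jensen-Shannon divergence; sqrt(js) is known to be a metric.
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

worst = np.inf
for _ in range(10_000):
    p, q, r = rng.dirichlet(np.ones(4), size=3)
    slack = np.sqrt(js(p, q)) + np.sqrt(js(q, r)) - np.sqrt(js(p, r))
    worst = min(worst, slack)
print("minimum triangle slack for sqrt(JS):", worst)  # stays nonnegative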
Log-Determinant Divergences Revisited: Alpha-Beta and Gamma Log-Det Divergences
In this paper, we review and extend a family of log-det divergences for symmetric positive definite (SPD) matrices and discuss their fundamental properties. We show how to generate from parameterized Alpha-Beta (AB) and Gamma log-det divergences many well-known divergences, for example, Stein's loss, the S-divergence, also called the Jensen-Bregman LogDet (JBLD) divergence, the Logdet Zero (Bhattac...
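As a concrete sketch of two members of this family, using their standard formulas with natural-log conventions (not code from the paper):

import numpy as np

def stein_loss(X, Y):
    # Stein's loss: the Bregman divergence of phi(X) = -log det X,
    # D(X, Y) = tr(X Y^{-1}) - log det(X Y^{-1}) - n.
    n = X.shape[0]
    M = X @ np.linalg.inv(Y)
    return float(np.trace(M) - np.linalg.slogdet(M)[1] - n)

def jbld(X, Y):
    # S-divergence / Jensen-Bregman LogDet divergence:
    # S(X, Y) = log det((X + Y)/2) - (1/2) log det(X Y).
    return float(np.linalg.slogdet((X + Y) / 2)[1]
                 - 0.5 * (np.linalg.slogdet(X)[1] + np.linalg.slogdet(Y)[1]))

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
X = A @ A.T + np.eye(4)   # random SPD matrices
Y = B @ B.T + np.eye(4)
print(stein_loss(X, Y), jbld(X, Y))  # both vanish iff X == Y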